Information Geometry and Minimum Description Length Networks

Authors

  • Ke Sun
  • Jun Wang
  • Alexandros Kalousis
  • Stéphane Marchand-Maillet
Abstract

We study parametric unsupervised mixture learning. We measure the loss of intrinsic information from the observations to complex mixture models, and then to simple mixture models. We present a geometric picture, where all these representations are regarded as free points in the space of probability distributions. Based on minimum description length, we derive a simple geometric principle to learn all these models together. We present a new learning machine with its theory, algorithms, and simulations.

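The abstract does not spell out the learning objective in detail, so the following is only a rough, hedged illustration of MDL-style model selection for mixtures, not the authors' actual algorithm: it fits Gaussian mixtures with increasing numbers of components and keeps the one minimizing a two-part description length, with the model cost approximated by the usual (k/2) log n term. The toy data, the candidate range, and the full-covariance parameter count are all illustrative choices.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Toy data: two well-separated Gaussian clusters in 2-D.
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),
               rng.normal(5.0, 1.0, size=(200, 2))])
n, d = X.shape

def description_length(gmm):
    # Two-part code length: data cost plus an asymptotic model cost.
    data_cost = -gmm.score(X) * n                       # total negative log-likelihood (nats)
    k = gmm.n_components
    n_params = (k - 1) + k * d + k * d * (d + 1) / 2    # weights + means + full covariances
    model_cost = 0.5 * n_params * np.log(n)
    return data_cost + model_cost

candidates = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 7)]
best = min(candidates, key=description_length)
print("selected number of components:", best.n_components)

With this parameter count the criterion is essentially BIC; the paper goes further by treating the observations, the complex models, and the simple models as points in the space of probability distributions and learning them jointly, which a component-count search like this does not capture.
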
Similar articles

Spatial Analysis in curved spaces with Non-Euclidean Geometry

The ultimate goal of spatial information, both as a technology and as a science, is to answer questions and address issues related to space, place, and location. Geometry is therefore widely used for description, storage, and analysis. Undoubtedly, one of the most essential features of spatial information is its geometry, and one of the most obvious types of analysis is the geometric type an...

Learning the structure of a Bayesian network: An application of Information Geometry and the Minimum Description Length Principle

The field of Bayesian networks has seen enormous development over the last few years and is one of the current key topics of research in the design of statistical machine learning and data mining algorithms. Bayesian networks are a natural marriage between two areas of mathematics: graph theory and probability theory. A Bayesian net encodes the probability distribution of a set of attributes ...

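The entry above does not reproduce the score itself; a commonly used two-part MDL score for a Bayesian network structure G on a dataset D of N records (the cited paper may use a different variant) trades a parameter-description term against the fit to the data:

\mathrm{MDL}(G, D) \;=\; \frac{\log N}{2}\,\lvert \Theta_G \rvert \;-\; \sum_{i=1}^{N} \log P\!\left(x^{(i)} \mid G, \hat{\Theta}_G\right)

where |Θ_G| is the number of free parameters of the structure G, Θ̂_G is their maximum-likelihood estimate, and structure learning searches for the graph that minimizes this score.
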
Minimum Energy Information Fusion in Sensor Networks

In this paper we consider how to organize the sharing of information in a distributed network of sensors and data processors so as to provide explanations for sensor readings with minimal expenditure of energy. We point out that the Minimum Description Length principle provides an approach to information fusion that is more naturally suited to energy minimization than traditional Bayesian appro...

MDL Procedures with ℓ1 Penalty and their Statistical Risk

We review recently developed theory for the Minimum Description Length principle, penalized likelihood, and its statistical risk. An information-theoretic condition on a penalty pen(f) yields the conclusion that the optimizer of the penalized log-likelihood criterion log 1/likelihood(f) + pen(f) has risk not more than the index of resolvability, corresponding to the accuracy of the optimizer of ...

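Written out in a standard formulation from the penalized-likelihood MDL literature (the surveyed paper may state it slightly differently), the estimator minimizes a penalized code length over a family F, and under a Kraft-type condition on pen(f) its risk is bounded by the index of resolvability:

\hat{f} \;=\; \operatorname*{arg\,min}_{f \in \mathcal{F}} \left\{ \log \frac{1}{\mathrm{likelihood}(f)} + \mathrm{pen}(f) \right\},
\qquad
\mathbb{E}\, d\!\left(f^{*}, \hat{f}\right) \;\le\; \min_{f \in \mathcal{F}} \left\{ D\!\left(f^{*} \,\Vert\, f\right) + \frac{\mathrm{pen}(f)}{n} \right\}

where D is the per-sample Kullback-Leibler divergence, d is a Hellinger/Bhattacharyya-type loss, and n is the sample size; in the ℓ1 case the penalty is taken proportional to the ℓ1 norm of the coefficient vector.
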
Journal title:

Volume   Issue

Pages  -

Publication date 2015